Convergence and Stability of a Class of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise.

Authors

  • Behtash Babadi
  • Demba Ba
  • Patrick L Purdon
  • Emery N Brown
Abstract

In this paper, we study the theoretical properties of a class of iteratively re-weighted least squares (IRLS) algorithms for sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between this class of algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribution. The IRLS algorithms we consider are parametrized by 0 < ν ≤ 1 and ε > 0. The EM formalism, as well as the connection to GSMs, allows us to establish that the IRLS(ν, ε) algorithms minimize ε-smooth versions of the ℓν 'norms'. We leverage EM theory to show that, for each 0 < ν ≤ 1, the limit points of the sequence of IRLS(ν, ε) iterates are stationary points of the ε-smooth ℓν 'norm' minimization problem on the constraint set. Finally, we employ techniques from compressive sampling (CS) theory to show that the class of IRLS(ν, ε) algorithms is stable for each 0 < ν ≤ 1, if the limit point of the iterates coincides with the global minimizer. For the case ν = 1, we show that the algorithm converges exponentially fast to a neighborhood of the stationary point, and outline its generalization to super-exponential convergence for ν < 1. We demonstrate our claims via simulation experiments. The simplicity of IRLS, along with the theoretical guarantees provided in this contribution, makes a compelling case for its adoption as a standard tool for sparse signal recovery.
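The abstract does not spell out the iteration itself, so the following is a minimal sketch of a standard IRLS(ν, ε) update for the ε-smoothed ℓν objective under a noiseless equality constraint; the constrained form, function name, and default parameters are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def irls(A, y, nu=1.0, eps=1e-4, n_iter=100):
    """Sketch of an IRLS(nu, eps) iteration minimizing the eps-smoothed
    l_nu 'norm'  sum_i (x_i**2 + eps)**(nu/2)  subject to A x = y.
    Each step solves a weighted least-squares problem in closed form."""
    # Minimum-l2-norm initializer satisfying A x = y.
    x = A.T @ np.linalg.solve(A @ A.T, y)
    for _ in range(n_iter):
        # Inverse IRLS weights: u_i = (x_i^2 + eps)^(1 - nu/2).
        u = (x**2 + eps) ** (1.0 - nu / 2.0)
        # A @ diag(u), computed via broadcasting over columns.
        Au = A * u
        # Weighted least-squares update: x = U A^T (A U A^T)^{-1} y.
        x = u * (A.T @ np.linalg.solve(Au @ A.T, y))
    return x
```

By the EM/majorization interpretation described above, each update keeps the iterate feasible (A x = y) and does not increase the ε-smoothed ℓν objective, which is what the two properties checked below reflect.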


Related articles

[Proceeding] Fast and Robust EM-Based IRLS Algorithm for Sparse Signal Recovery from Noisy Measurements

In this paper, we analyze a new class of iterative re-weighted least squares (IRLS) algorithms and their effectiveness in signal recovery from incomplete and inaccurate linear measurements. These methods can be interpreted as constrained maximum likelihood estimation under a two-state Gaussian scale mixture assumption on the signal. We show that this class of algorithms, which performs exac...


Convergence and Stability of Iteratively Re-weighted Least Squares Algorithms for Sparse Signal Recovery in the Presence of Noise

In this paper, we study the theoretical properties of a class of iteratively re-weighted least squares (IRLS) algorithms for sparse signal recovery in the presence of noise. We demonstrate a one-to-one correspondence between this class of algorithms and a class of Expectation-Maximization (EM) algorithms for constrained maximum likelihood estimation under a Gaussian scale mixture (GSM) distribut...


Proximal linearized iteratively reweighted least squares for a class of nonconvex and nonsmooth problems

For solving a wide class of nonconvex and nonsmooth problems, we propose a proximal linearized iteratively reweighted least squares (PL-IRLS) algorithm. We first approximate the original problem by smoothing methods, and second write the approximated problem into an auxiliary problem by introducing new variables. PL-IRLS is then built on solving the auxiliary problem by utilizing the proximal l...


Frames for compressed sensing using coherence

We give some new results on sparse signal recovery in the presence of noise, for weighted spaces. Traditionally, dictionaries whose elements have norm equal to 1 were used, but for random dictionaries this condition is rarely satisfied. Moreover, we give better estimates than the ones given recently by Cai, Wang and Xu.


Speech enhancement based on hidden Markov model using sparse code shrinkage

This paper presents a new hidden Markov model-based (HMM-based) speech enhancement framework based on independent component analysis (ICA). We propose analytical procedures for training clean speech and noise models by the Baum re-estimation algorithm and present a maximum a posteriori (MAP) estimator based on a Laplace-Gaussian combination (for clean speech and noise, respectively) in the HMM ...



Journal:
  • IEEE transactions on signal processing : a publication of the IEEE Signal Processing Society

Volume 62, Issue 1

Pages: -

Publication date: 2013